Efficient Calculation of Sensitivities for Optimization Problems

Authors

  • Andreas Kowarz
  • Andrea Walther
Abstract

Sensitivity information is required by numerous applications such as optimization algorithms, parameter estimation, or real-time control. Sensitivities can be computed with working accuracy using the forward mode of automatic differentiation (AD). ADOL-C is an AD tool for programs written in C or C++. Originally, when applying ADOL-C, tapes for values, operations, and locations are written during the function evaluation to generate an internal function representation. Subsequently, these tapes are evaluated to compute derivatives, sparsity patterns, etc., using the forward or reverse mode of AD. The generation of the tapes can be completely avoided by applying the recently implemented tapeless variant of the forward mode for scalar and vector calculations. The tapeless forward mode enables the joint computation of function and derivative values directly from main memory within one sweep. Compared to the original approach, shorter runtimes are achieved due to the avoidance of tape handling and a more effective, joint optimization of function and derivative code. Advantages and disadvantages of the tapeless forward mode provided by ADOL-C will be discussed. Furthermore, runtime comparisons for two implemented variants of the tapeless forward mode are presented. The results are based on two numerical examples that require the computation of sensitivity information.
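
As an illustration of the tapeless approach, the following minimal sketch shows how a function and its directional derivatives can be evaluated jointly in one sweep. It assumes the adtl::adouble class, the adolc/adtl.h header, and the setNumDir/setADValue/getADValue accessors of more recent ADOL-C releases; these names, as well as the test function f itself, are illustrative assumptions and may differ from the variant discussed in the paper.

    #include <adolc/adtl.h>   // tapeless forward mode of ADOL-C (assumed header)
    #include <cstdio>

    // Illustrative test function f(x0, x1) = x0 * x1 + sin(x0).
    // With adtl::adouble arguments, function values and directional
    // derivatives are propagated together, without writing any tape.
    static adtl::adouble f(const adtl::adouble &x0, const adtl::adouble &x1) {
        return x0 * x1 + sin(x0);
    }

    int main() {
        adtl::setNumDir(2);            // vector mode: two derivative directions

        adtl::adouble x0(3.0), x1(5.0);
        x0.setADValue(0, 1.0);         // seed direction 0 (unit vector e_0)
        x1.setADValue(1, 1.0);         // seed direction 1 (unit vector e_1)

        adtl::adouble y = f(x0, x1);   // value and both derivatives in one sweep

        std::printf("f      = %f\n", y.getValue());
        std::printf("df/dx0 = %f\n", y.getADValue(0));
        std::printf("df/dx1 = %f\n", y.getADValue(1));
        return 0;
    }

In contrast, the taped approach first records the operations during an evaluation with active variables and then evaluates the resulting tape with the forward or reverse mode; the tapeless sketch above avoids that second pass and the associated tape handling.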

Similar resources

Efficient Optimum Design of Structures With Frequency Response Constraint Using High Quality Approximation

An efficient technique is presented for optimum design of structures with both natural frequency and complex frequency response constraints. The main idea is to reduce the number of dynamic analyses by introducing high-quality approximations. Eigenvalues are approximated using the Rayleigh quotient. Eigenvectors are also approximated for the evaluation of eigenvalues and frequency responses. A tw...

An efficient improvement of the Newton method for solving nonconvex optimization problems

The Newton method is one of the most famous numerical methods among the line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method to solve unconstrained optimization problems is presented. The significant ...

An Efficient Conjugate Gradient Algorithm for Unconstrained Optimization Problems

In this paper, an efficient conjugate gradient method for unconstrained optimization is introduced. Parameters of the method are obtained by solving an optimization problem and using a variant of the modified secant condition. The new conjugate gradient parameter benefits from function information as well as gradient information in each iteration. The proposed method has global convergence und...

An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems

Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...

An Efficient Neurodynamic Scheme for Solving a Class of Nonconvex Nonlinear Optimization Problems

By p-power (or partial p-power) transformation, the Lagrangian function in a nonconvex optimization problem becomes locally convex. In this paper, we present a neural network based on an NCP function for solving the nonconvex optimization problem. An important feature of this neural network is the one-to-one correspondence between its equilibria and KKT points of the nonconvex optimizatio...

Publication date: 2007